A New Linear Dimensionality Reduction Technique Based on Chernoff Distance
Authors
Abstract
A new linear dimensionality reduction (LDR) technique for pattern classification and machine learning is presented which, though linear, aims at maximizing the Chernoff distance in the transformed space. The corresponding two-class criterion, which is maximized via a gradient-based algorithm, is presented, and initialization procedures are also discussed. Empirical results on synthetic and real-life data, obtained by combining this and traditional LDR approaches with two well-known classifiers (linear and quadratic), show that the proposed criterion outperforms the traditional schemes.
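For two multivariate Gaussians the Chernoff distance has a closed form, which is what makes it tractable as a class-separability criterion in the projected space. A minimal NumPy sketch, assuming Gaussian class-conditional densities (function names are illustrative, not from the paper):

```python
import numpy as np

def chernoff_distance(mu1, S1, mu2, S2, beta=0.5):
    """Chernoff distance between Gaussians N(mu1, S1) and N(mu2, S2).

    beta lies in (0, 1); beta = 0.5 gives the Bhattacharyya distance.
    """
    Sw = beta * S1 + (1 - beta) * S2                 # weighted covariance
    diff = mu2 - mu1
    quad = beta * (1 - beta) / 2 * diff @ np.linalg.solve(Sw, diff)
    logdet = 0.5 * np.log(
        np.linalg.det(Sw)
        / (np.linalg.det(S1) ** beta * np.linalg.det(S2) ** (1 - beta))
    )
    return quad + logdet

def chernoff_in_subspace(A, mu1, S1, mu2, S2, beta=0.5):
    """Same distance after projecting with a d x n matrix A (y = A x)."""
    return chernoff_distance(A @ mu1, A @ S1 @ A.T,
                             A @ mu2, A @ S2 @ A.T, beta)
```

The projection-space variant simply evaluates the same formula on the projected means A mu and covariances A Sigma A^T, which is the quantity an LDR method of this kind would seek to maximize over A.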
Similar works
On the Performance of Chernoff-Distance-Based Linear Dimensionality Reduction Techniques
We present a performance analysis of three linear dimensionality reduction techniques: Fisher's discriminant analysis (FDA), and two recently introduced methods based on the Chernoff distance between two distributions: the Loog and Duin (LD) method, which aims to maximize a criterion derived from the Chernoff distance in the original space, and the method introduced by Rueda and Herrera (RH), whic...
Improving Chernoff criterion for classification by using the filled function
Linear discriminant analysis is a well-known matrix-based dimensionality reduction method. It is a supervised feature extraction method used in two-class classification problems. However, it is incapable of dealing with data in which the classes have unequal covariance matrices. To address this issue, the Chernoff distance is an appropriate criterion for measuring distances between distributions. In the p...
Linear dimensionality reduction by maximizing the Chernoff distance in the transformed space
Linear dimensionality reduction (LDR) techniques are quite important in pattern recognition due to their linear time complexity and simplicity. In this paper, we present a novel LDR technique which, though linear, aims to maximize the Chernoff distance in the transformed space; thus, augmenting the class separability in such a space. We present the corresponding criterion, which is maximized vi...
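As a hedged illustration of the general idea, gradient ascent on the Chernoff distance of the projected classes, the optimization can be sketched with numerical gradients. This is a toy stand-in under Gaussian assumptions, not the authors' actual update rule; all names are hypothetical:

```python
import numpy as np

def chernoff(mu1, S1, mu2, S2, beta=0.5):
    # Closed-form Chernoff distance between two Gaussians.
    Sw = beta * S1 + (1 - beta) * S2
    diff = mu2 - mu1
    quad = beta * (1 - beta) / 2 * diff @ np.linalg.solve(Sw, diff)
    return quad + 0.5 * np.log(np.linalg.det(Sw)
                               / (np.linalg.det(S1) ** beta
                                  * np.linalg.det(S2) ** (1 - beta)))

def fit_projection(mu1, S1, mu2, S2, d, steps=400, lr=0.05, h=1e-5, seed=0):
    """Toy gradient ascent on J(A) = Chernoff distance of the projected classes."""
    rng = np.random.default_rng(seed)
    n = mu1.shape[0]
    A = rng.standard_normal((d, n))                  # random initialization

    def J(A):
        return chernoff(A @ mu1, A @ S1 @ A.T, A @ mu2, A @ S2 @ A.T)

    for _ in range(steps):
        G = np.zeros_like(A)                         # finite-difference gradient
        for i in range(d):
            for j in range(n):
                E = np.zeros_like(A)
                E[i, j] = h
                G[i, j] = (J(A + E) - J(A - E)) / (2 * h)
        A += lr * G                                  # ascent step
        A /= np.linalg.norm(A)                       # keep the scale bounded
    return A
```

Because the projected-space Chernoff distance is invariant to rescaling (indeed, to any invertible recombination of the rows of A), normalizing A at each step only keeps the iterates bounded without changing the objective.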
Linear Discriminant Analysis Algorithms
We propose new algorithms for computing linear discriminants to perform data dimensionality reduction from R^n to R^p, with p < n. We propose alternatives to the classical Fisher's distance criterion; namely, we investigate new criteria based on the Chernoff distance, J-divergence, and Kullback-Leibler divergence. The optimization problems that emerge from using these alternative criteria are non-c...
Chernoff Dimensionality Reduction-Where Fisher Meets FKT
The well-known linear discriminant analysis (LDA) based on the Fisher criterion is incapable of dealing with heteroscedasticity in data. However, in many practical applications we often encounter heteroscedastic data, i.e., within-class scatter matrices that cannot be expected to be equal. A technique based on the Chernoff criterion for linear dimensionality reduction has been proposed recently. The te...
Journal:
Volume, Issue:
Pages: -
Publication year: 2006